Discrete and continuous representations and processing in deep learning: Looking forward
Authors
Abstract
Discrete and continuous representations of content (e.g., of language or images) have interesting properties to be explored for the understanding of, or reasoning with, this content by machines. This position paper puts forward our opinion on the role of discrete and continuous representations and their processing in the deep learning field. Current neural network models compute continuous-valued data. Information is compressed into dense, distributed embeddings. By stark contrast, humans use discrete symbols in their communication with language. Such symbols represent a compressed version of the world that derives its meaning from shared contextual information. Additionally, human reasoning involves symbol manipulation at a cognitive level, which facilitates abstract reasoning, the composition of knowledge and understanding, generalization, and efficient learning. Motivated by these insights, we argue that combining discrete and continuous representations and their processing will be essential to build systems that exhibit a general form of intelligence. We suggest and discuss several avenues that could improve current neural networks with the inclusion of discrete elements, to combine the advantages of both types of representations.
Similar resources
Deep Learning of Representations: Looking Forward
Deep learning research aims at discovering learning algorithms that discover multiple levels of distributed representations, with higher levels representing more abstract concepts. Although the study of deep learning has already led to impressive theoretical results, learning algorithms and breakthrough experiments, several challenges lie ahead. This paper proposes to examine some of these chal...
Deep Learning and Continuous Representations for Natural Language Processing
Deep learning techniques have demonstrated tremendous success in the speech and language processing community in recent years, establishing new state-of-the-art performance in speech recognition and language modeling, and have shown great potential for many other natural language processing tasks. The focus of this tutorial is to provide an extensive overview on recent deep learning approaches to p...
Joint-VAE: Learning Disentangled Joint Continuous and Discrete Representations
We present a framework for learning disentangled and interpretable jointly continuous and discrete representations in an unsupervised manner. By augmenting the continuous latent distribution of variational autoencoders with a relaxed discrete distribution and controlling the amount of information encoded in each latent unit, we show how continuous and categorical factors of variation can be dis...
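The relaxed discrete distribution mentioned in this snippet is commonly realized with a Gumbel-softmax (concrete) relaxation, which makes sampling from a categorical latent differentiable. The following is a minimal NumPy sketch of that relaxation, not the Joint-VAE implementation itself; the function name and parameter values are illustrative.

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=0.5, rng=None):
    """Draw a relaxed (continuous) sample from a categorical distribution.

    At low temperature the sample approaches a one-hot (discrete) vector;
    at higher temperatures it stays smooth and differentiable, which is
    what allows gradients to flow through a discrete latent factor.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Perturb the logits with Gumbel(0, 1) noise, then apply a
    # temperature-scaled softmax.
    y = (logits + rng.gumbel(size=logits.shape)) / temperature
    e = np.exp(y - y.max())
    return e / e.sum()

# Three-category latent factor; low temperature yields a nearly
# one-hot sample while remaining differentiable.
sample = gumbel_softmax_sample(np.array([2.0, 0.5, 0.1]), temperature=0.2)
```

Controlling the amount of information encoded in each latent unit, as the snippet describes, would then amount to capacity constraints on the KL terms of the continuous and discrete latents during training.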
Assessment of Deep Word Knowledge in Elementary and Advanced Iranian EFL Learners: A Comparison of Selective and Productive WAT Tasks
Testing plays a vital role in any language teaching program. It allows teachers and stakeholders, including program administrators, parents, admissions officers and prospective employers, to be assured that the learners are progressing according to an accepted standard (Douglas, 2010). The problems currently facing language testers have both practical and theoretical implications but the first i...
Looking back and looking forward
This year Molecular Genetics & Genomic Medicine (MGGM) turns 5 years old. For the editors of this journal it is time for some reflection, "looking back and looking forward". The first issue appeared in May of 2013, 60 years after Watson and Crick's publication of the double-helical structure of DNA and 10 years after the completion of the Human Genome Project. The work for this...
Journal
Journal title: AI Open
Year: 2021
ISSN: 2666-6510
DOI: https://doi.org/10.1016/j.aiopen.2021.07.002